A Theoretical Comparison of Two Linear Dimensionality Reduction Techniques
Authors
Abstract
A theoretical analysis comparing two linear dimensionality reduction (LDR) techniques, namely Fisher's discriminant (FD) and Loog-Duin (LD) dimensionality reduction, is presented. The necessary and sufficient conditions under which FD and LD yield the same linear transformation are stated and proved. To derive these conditions, it is first shown that the two criteria attain the same maximum value after a diagonalization process is applied; the necessary and sufficient conditions are then derived for several cases, including coincident covariance matrices, coincident prior probabilities, and the case in which one of the covariances is the identity matrix. A measure for comparing the two criteria is obtained from these conditions and used to show empirically that they are statistically related to the classification error of a post-processing quadratic classifier and to the Chernoff distance in the transformed space.
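For concreteness, the sketch below gives one common formulation of the two criteria for two Gaussian classes with means m1, m2, covariances S1, S2 and priors p1, p2: Fisher's discriminant takes the leading eigenvectors of S_W^{-1} S_B, while the Loog-Duin criterion replaces S_B with a Chernoff directed-distance matrix (following the standard formulation of Loog and Duin, 2004). The Chernoff distance used to evaluate the transformed space is also included. This is an illustrative sketch of the standard formulations, not the paper's exact notation, and all function names are hypothetical.

```python
# Illustrative sketch (not the paper's notation): two-class FD and LD criteria
# for Gaussian classes N(m1, S1), N(m2, S2) with priors p1, p2.
import numpy as np
from scipy.linalg import eigh, inv, logm, sqrtm


def fd_transform(m1, m2, S1, S2, p1, p2, d):
    """Fisher's discriminant: top-d eigenvectors of S_W^{-1} S_B."""
    SW = p1 * S1 + p2 * S2                    # within-class scatter
    SB = np.outer(m1 - m2, m1 - m2)           # between-class scatter
    w, V = eigh(SB, SW)                       # generalized eigenvalue problem
    return V[:, np.argsort(w)[::-1][:d]].T    # d x n transformation matrix A


def ld_transform(m1, m2, S1, S2, p1, p2, d):
    """Loog-Duin: S_B is replaced by a Chernoff directed-distance matrix
    (assumed formulation, after Loog & Duin, 2004)."""
    SW = p1 * S1 + p2 * S2
    SB = np.outer(m1 - m2, m1 - m2)
    SWh = np.real(sqrtm(SW))                  # S_W^{1/2}
    SWih = inv(SWh)                           # S_W^{-1/2}
    SC = SWh @ (SWih @ SB @ SWih
                - (p1 * np.real(logm(SWih @ S1 @ SWih))
                   + p2 * np.real(logm(SWih @ S2 @ SWih))) / (p1 * p2)) @ SWh
    SC = (SC + SC.T) / 2                      # symmetrize against round-off
    w, V = eigh(SC, SW)
    return V[:, np.argsort(w)[::-1][:d]].T


def chernoff_distance(m1, m2, S1, S2, s=0.5):
    """Chernoff distance between the two Gaussians (s = 0.5: Bhattacharyya)."""
    Ss = s * S1 + (1 - s) * S2
    dm = m1 - m2
    quad = 0.5 * s * (1 - s) * dm @ inv(Ss) @ dm
    logdet = 0.5 * np.log(np.linalg.det(Ss)
                          / (np.linalg.det(S1) ** s * np.linalg.det(S2) ** (1 - s)))
    return quad + logdet
```

In the reduced space, the same distance can be evaluated on the projected parameters, e.g. chernoff_distance(A @ m1, A @ m2, A @ S1 @ A.T, A @ S2 @ A.T) for a transformation A returned by either function.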
Similar Papers
Impact of linear dimensionality reduction methods on the performance of anomaly detection algorithms in hyperspectral images
Anomaly Detection (AD) has recently become an important application of hyperspectral image analysis. The goal of these algorithms is to find objects in the image scene that are anomalous in comparison to their surrounding background. One way to improve the performance and runtime of these algorithms is to use Dimensionality Reduction (DR) techniques. This paper evaluates the effect of thr...
Full text
2D Dimensionality Reduction Methods without Loss
In this paper, several two-dimensional extensions of the principal component analysis (PCA) and linear discriminant analysis (LDA) techniques have been applied in a lossless dimensionality reduction framework for face recognition. In this framework, the benefits of dimensionality reduction were used to improve the performance of its predictive model, which was a support vector machine (...
Full text
A theoretical comparison of two-class Fisher's and heteroscedastic linear dimensionality reduction schemes
We present a theoretical analysis for comparing two linear dimensionality reduction (LDR) techniques for two classes, a homoscedastic LDR scheme, Fisher’s discriminant (FD), and a heteroscedastic LDR scheme, Loog-Duin (LD). We formalize the necessary and sufficient conditions for which the FD and LD criteria are maximized for the same linear transformation. To derive these conditions, we first ...
Full text
Large-scale SVD and manifold learning
This paper examines the efficacy of sampling-based low-rank approximation techniques when applied to large dense kernel matrices. We analyze two common approximate singular value decomposition techniques, namely the Nyström and Column sampling methods. We present a theoretical comparison between these two methods, provide novel insights regarding their suitability for various tasks and present ...
Full text
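As an illustration of the sampling-based low-rank approximation discussed in the entry above, the following is a minimal sketch of the Nyström method (one of the two techniques compared there), assuming uniform sampling of l columns of an n x n positive semidefinite kernel matrix K without replacement; the function and variable names are hypothetical, not taken from that paper.

```python
# Minimal Nyström sketch: approximate a PSD kernel matrix K from l sampled
# columns as K ~= C W^+ C^T, where C holds the sampled columns and W is the
# l x l block of K at the sampled rows/columns.
import numpy as np


def nystrom_approximation(K, l, rng=None):
    """Rank-l Nyström approximation of a symmetric PSD matrix K."""
    rng = rng or np.random.default_rng(0)
    n = K.shape[0]
    idx = rng.choice(n, size=l, replace=False)   # uniformly sampled columns
    C = K[:, idx]                                # n x l sampled columns
    W = K[np.ix_(idx, idx)]                      # l x l intersection block
    return C @ np.linalg.pinv(W) @ C.T           # K ~= C W^+ C^T


# Usage on a small synthetic RBF kernel matrix:
X = np.random.default_rng(1).normal(size=(500, 10))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / (2 * X.shape[1]))
K_hat = nystrom_approximation(K, l=50)
print(np.linalg.norm(K - K_hat, "fro") / np.linalg.norm(K, "fro"))
```

The column-sampling method compared in that paper instead builds its approximation from the singular value decomposition of C alone; only the Nyström variant is sketched here.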